112 results for decision-making, decision modelling, value of information

in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance: 100.00%

Abstract:

Contestants are predicted to adjust the cost of a fight in line with the perceived value of the resource and this provides a way of determining whether the resource has been assessed. An assessment of resource value is predicted to alter an animal's motivational state and we note different methods of measuring that state. We provide a categorical framework in which the degree of resource assessment may be evaluated and also note limitations of various approaches. We place studies in six categories: (1) cases of no assessment, (2) cases of internal state such as hunger influencing apparent value, (3) cases of the contestants differing in assessment ability, (4) cases of mutual and equal assessment of value, (5) cases where opponents differ in resource value and (6) cases of particularly complex assessment abilities that involve a comparison of the value of two resources. We examine the extent to which these studies support game theory predictions and suggest future areas of research. (C) 2008 The Association for the Study of Animal Behaviour. Published by Elsevier Ltd. All rights reserved.

Relevance: 100.00%

Abstract:

An important issue in risk analysis is the distinction between epistemic and aleatory uncertainties. In this paper, the use of distinct representation formats for aleatory and epistemic uncertainties is advocated, the latter being modelled by sets of possible values. Modern uncertainty theories based on convex sets of probabilities are known to be instrumental for hybrid representations where aleatory and epistemic components of uncertainty remain distinct. Simple uncertainty representation techniques based on fuzzy intervals and p-boxes are used in practice. This paper outlines a risk analysis methodology from elicitation of knowledge about parameters to decision. It proposes an elicitation methodology where the chosen representation format depends on the nature and the amount of available information. Uncertainty propagation methods then blend Monte Carlo simulation and interval analysis techniques. Nevertheless, results provided by these techniques, often in terms of probability intervals, may be too complex for a decision-maker to interpret, and we therefore propose to compute a unique indicator of the likelihood of risk, called confidence index. It explicitly accounts for the decision-maker's attitude in the face of ambiguity. This step takes place at the end of the risk analysis process, when no further collection of evidence is possible that might reduce the ambiguity due to epistemic uncertainty. This last feature stands in contrast with the Bayesian methodology, where epistemic uncertainties on input parameters are modelled by single subjective probabilities at the beginning of the risk analysis process.
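The blended propagation step described in this abstract (Monte Carlo sampling over the aleatory variables, interval analysis over the epistemic ones) can be sketched as follows. The one-dimensional model, the uniform aleatory input in the usage example, and the assumption that f is monotone in its epistemic argument are illustrative choices, not the paper's case study:

```python
import random

def hybrid_propagation(f, sample_aleatory, epistemic_interval, n=10_000, seed=0):
    """For each Monte Carlo draw of the aleatory input, push the epistemic
    interval through f by interval analysis.  f is assumed monotone in the
    epistemic argument, so the interval endpoints bound the output."""
    rng = random.Random(seed)
    lo, hi = epistemic_interval
    lows, highs = [], []
    for _ in range(n):
        x = sample_aleatory(rng)
        y0, y1 = f(x, lo), f(x, hi)
        lows.append(min(y0, y1))
        highs.append(max(y0, y1))
    return lows, highs

def probability_interval(lows, highs, threshold):
    """Bounds on P(output > threshold): for a given sample the event is
    certain only if the whole output interval exceeds the threshold, and
    merely possible if only the upper endpoint does."""
    n = len(lows)
    p_low = sum(lo > threshold for lo in lows) / n
    p_high = sum(hi > threshold for hi in highs) / n
    return p_low, p_high
```

For example, with an aleatory input uniform on [0, 1], an epistemic term only known to lie in [0, 0.5], and f(x, e) = x + e, the probability that the output exceeds 0.75 is returned not as a point value but as an interval near [0.25, 0.75], which is exactly the kind of result the confidence index is then meant to summarize.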

Relevance: 100.00%

Abstract:

The issue of inherited disorders in pedigree dogs is not a recent phenomenon and reports of suspected genetic defects associated with breeding practices date back to Charles Darwin's time. In recent years, much information on the array of inherited defects has been assimilated and the true extent of the problem has come to light. Historically, the direction of research funding in the field of canine genetic disease has been largely influenced by the potential transferability of findings to human medicine, economic benefit and importance of dogs for working purposes. More recently, the argument for a more canine welfare-orientated approach has been made, targeting research efforts at the alleviation of the most suffering in the greatest number of animals.

Relevance: 100.00%

Abstract:

We present a new algorithm for exactly solving decision-making problems represented as an influence diagram. We do not require the usual assumptions of no forgetting and regularity, which allows us to solve problems with limited information. The algorithm, which implements a sophisticated variable elimination procedure, is empirically shown to outperform a state-of-the-art algorithm in randomly generated problems of up to 150 variables and 10^64 strategies.
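The kind of problem being solved can be illustrated with a toy influence diagram: one chance variable (weather) generates an observation (forecast), and a decision observes only the forecast. The sketch below solves it by brute-force policy enumeration, which is a baseline for illustration only, not the paper's variable-elimination algorithm; the probabilities and utilities are invented:

```python
from itertools import product

P_WEATHER = {"rain": 0.3, "sun": 0.7}
# P_FORECAST[(f, w)] = P(forecast = f | weather = w)
P_FORECAST = {("rain", "rain"): 0.8, ("sun", "rain"): 0.2,
              ("rain", "sun"): 0.25, ("sun", "sun"): 0.75}
UTILITY = {("umbrella", "rain"): 70, ("umbrella", "sun"): 20,
           ("none", "rain"): 0, ("none", "sun"): 100}

def expected_utility(policy):
    """policy maps an observed forecast to an action."""
    return sum(pw * P_FORECAST[(f, w)] * UTILITY[(policy[f], w)]
               for w, pw in P_WEATHER.items()
               for f in ("rain", "sun"))

def solve():
    """Enumerate every forecast -> action policy and keep the best one."""
    policies = (dict(zip(("rain", "sun"), acts))
                for acts in product(("umbrella", "none"), repeat=2))
    return max(policies, key=expected_utility)
```

Enumeration is exponential in the number of decision rules, which is why problems with 10^64 strategies, as in the abstract, require a structured method such as variable elimination rather than this direct search.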

Relevance: 100.00%

Abstract:

DESIGN: We will address our research objectives by searching the published and unpublished literature and conducting an evidence synthesis of i) studies of the effectiveness of psychosocial interventions provided for children and adolescents who have suffered maltreatment, ii) economic evaluations of these interventions and iii) studies of their acceptability to children, adolescents and their carers.

SEARCH STRATEGY: Evidence will be identified via electronic databases for health and allied health literature, social sciences and social welfare, education and other evidence-based depositories, and economic databases. We will identify material generated by user-led, voluntary-sector enquiry by searching the internet and browsing the websites of relevant UK government departments and charities. Additionally, studies will be identified via the bibliographies of retrieved articles/reviews, targeted author searches and forward citation searching. We will also use our extensive professional networks, and our planned consultations with key stakeholders and our study steering committee. Databases will be searched from inception to time of search.

REVIEW STRATEGY: Inclusion criteria: 1) infants, children or adolescents who have experienced maltreatment between the ages of 0 and 17 years; 2) all psychosocial interventions available for maltreated children and adolescents, by any provider and in any setting, aiming to address the sequelae of any form of maltreatment, including fabricated illness; 3) for synthesis of evidence of effectiveness, all controlled studies in which psychosocial interventions are compared with no-treatment, treatment-as-usual, waitlist or other-treated controls. For a synthesis of evidence of acceptability we will include any design that asks participants for their views or provides data on non-participation. For decision-analytic modelling we may include uncontrolled studies. Primary and secondary outcomes will be confirmed in consultation with stakeholders.

Provisional primary outcomes are i) psychological distress/mental health (particularly PTSD, depression, anxiety and self-harm); ii) behaviour; iii) social functioning; iv) cognitive/academic attainment; v) quality of life; and vi) costs. After studies that meet the inclusion criteria have been identified (independently by two reviewers), data will be extracted and risk of bias (RoB) assessed (independently by two reviewers) using the Cochrane Collaboration RoB tool (effectiveness), quality hierarchies of data sources for economic analyses (cost-effectiveness) and the CASP tool for qualitative research (acceptability). Where interventions are similar and appropriate data are available (or can be obtained), evidence synthesis will be performed to pool the results. Where possible, we will explore the extent to which age, maltreatment history (including whether intra- or extra-familial), time since maltreatment, care setting (family/out-of-home care, including foster care/residential), care history and characteristics of intervention (type, setting, provider, duration) moderate the effects of psychosocial interventions. A synthesis of acceptability data will be undertaken using a narrative approach. A decision-analytic model will be constructed to compare the expected cost-effectiveness of the different types of intervention identified in the systematic review. We will also conduct a value of information analysis if the data permit.

EXPECTED OUTPUTS: A synthesis of the effectiveness and cost-effectiveness of psychosocial interventions for maltreated children (taking into account age, maltreatment profile and setting) and their acceptability to key stakeholders.

Relevance: 100.00%

Abstract:

Although it is well known that sandstone porosity and permeability are controlled by a range of parameters such as grain size and sorting, amount, type, and location of diagenetic cements, extent and type of compaction, and the generation of intergranular and intragranular secondary porosity, it is less well constrained how these controlling parameters link up in rock volumes (within and between beds) and how they spatially interact to determine porosity and permeability. To address these unknowns, this study examined Triassic fluvial sandstone outcrops from the UK using field logging, probe permeametry of 200 points, and sampling at 100 points on a gridded rock surface. These field observations were supplemented by laser particle-size analysis, thin-section point-count analysis of primary and diagenetic mineralogy, quantitative XRD mineral analysis, and SEM/EDAX analysis of all 100 samples. These data were analyzed using global regression, variography, kriging, conditional simulation, and geographically weighted regression to examine the spatial relationships between porosity and permeability and their potential controls. The results of bivariate analysis (global regression) of the entire outcrop dataset indicate only a weak correlation between both permeability and porosity and their diagenetic and depositional controls, and provide very limited information on the role of primary textural structures such as grain size and sorting. Subdividing the dataset further by bedding unit revealed details of more local controls on porosity and permeability. An alternative geostatistical approach combined with a local modelling technique (geographically weighted regression; GWR) subsequently was used to examine the spatial variability of porosity and permeability and their controls. 
The use of GWR does not require prior knowledge of divisions between bedding units, but the results from GWR broadly concur with results of regression analysis by bedding unit and provide much greater clarity of how porosity and permeability and their controls vary laterally and vertically. The close relationship between depositional lithofacies in each bed, diagenesis, permeability, and porosity demonstrates that each influences the other, and in turn how understanding of reservoir properties is enhanced by integration of paleoenvironmental reconstruction, stratigraphy, mineralogy, and geostatistics.
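The local-fitting idea behind GWR can be sketched for a single predictor: at each sample location a weighted least-squares regression is fitted, with weights decaying with distance, so the slope and intercept vary across the outcrop. The Gaussian kernel, the fixed bandwidth, and the toy data in the test are illustrative assumptions, not the study's configuration:

```python
import math

def gwr_slopes(coords, x, y, bandwidth):
    """At each sample location, fit a one-predictor weighted least-squares
    regression (e.g. permeability on porosity) with a Gaussian distance
    kernel, yielding a locally varying (slope, intercept) pair."""
    out = []
    for xi, yi in coords:
        # Gaussian kernel weight for every sample relative to this location
        w = [math.exp(-((xi - xj) ** 2 + (yi - yj) ** 2) / (2 * bandwidth ** 2))
             for xj, yj in coords]
        sw = sum(w)
        mx = sum(wi * xv for wi, xv in zip(w, x)) / sw
        my = sum(wi * yv for wi, yv in zip(w, y)) / sw
        cov = sum(wi * (xv - mx) * (yv - my) for wi, xv, yv in zip(w, x, y))
        var = sum(wi * (xv - mx) ** 2 for wi, xv in zip(w, x))
        slope = cov / var if var else 0.0
        out.append((slope, my - slope * mx))
    return out
```

Mapping these local coefficients is what reveals the lateral and vertical variation in controls that a single global regression averages away.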

Relevance: 100.00%

Abstract:

Much interest now focuses on the use of the contingent valuation method (CVM) to assess non-use value of environmental goods. The paper reviews recent literature and highlights particular problems of information provision and respondent knowledge, comprehension and cognition. These must be dealt with by economists in designing CVM surveys for eliciting non-use values. Cognitive questionnaire design methods are presented which invoke concepts from psychology and tools from cognitive survey design (focus groups and verbal reports) to reduce a complex environmental good into a meaningful commodity that can be valued by respondents in a contingent market. This process is illustrated with examples from the authors' own research valuing alternative afforestation programmes.

Relevance: 100.00%

Abstract:

Purpose
Information science has been conceptualized as a partly unreflexive response to developments in information and computer technology, and, most powerfully, as part of the gestalt of the computer. The computer was viewed as an historical accident in the original formulation of the gestalt. An alternative, and timely, approach to understanding, and then dissolving, the gestalt would be to address the motivating technology directly, fully recognizing it as a radical human construction. This paper aims to address these issues.

Design/methodology/approach
The paper adopts a social epistemological perspective and is concerned with collective, rather than primarily individual, ways of knowing.

Findings
In the language of discussions in information science, information technology tends to be received as objectively given, autonomously developing, and causing but not itself caused. It has also been characterized as artificial, in the sense of unnatural, and sometimes as threatening. Attitudes to technology are implied rather than explicit, and can appear weak when articulated, corresponding to collective repression.

Research limitations/implications
Receiving technology as objectively given has an analogy with the Platonist view of mathematical propositions as discovered, in its exclusion of human activity, opening up the possibility of a comparable critique which insists on human agency.

Originality/value
Apprehensions of information technology have been raised to consciousness, exposing their limitations.

Relevance: 100.00%

Abstract:

This article draws on qualitative research that explores the concept of public value in the delivery of sport services by the organization Sport England. The research took place against a backdrop of shifting priorities following the award of the 2012 Olympic Games to London. It highlights the difficulties that exist in measuring the qualitative nature of the public value of sport and suggests a need to understand the idea better. Research with organizations involved alongside Sport England in the delivery of sport is described. This explores the potential to create a public value vision, how to measure it, and how to focus public value on delivery beyond the aim of ‘sport for sport’s sake’ and more towards ‘sport for the greater good’. The article argues that this represents a game of ‘two halves’, in which the first half focuses on 2012 and the second half is concerned with its legacy.

Relevance: 100.00%

Abstract:

Segregation measures have been applied in the study of many societies, and traditionally such measures have been used to assess the degree of division between social and cultural groups across urban areas, wider regions, or perhaps national areas. The degree of segregation can vary substantially from place to place even within very small areas. In this paper the substantive concern is with religious/political segregation in Northern Ireland—particularly the proportion of Protestants (often taken as an indicator of those who wish to retain the union with Britain) to Catholics (often taken as an indicator of those who favour union with the Republic of Ireland). Traditionally, segregation is measured globally—that is, across all units in a given area. A recent trend in spatial data analysis generally, and in segregation analysis specifically, is to assess local features of spatial datasets. The rationale behind such approaches is that global methods may obscure important spatial variations in the property of interest, and thus prevent full use of the data. In this paper the utility of local measures of residential segregation is assessed with reference to the religious/political composition of Northern Ireland. The paper demonstrates marked spatial variations in the degree and nature of residential segregation across Northern Ireland. It is argued that local measures provide highly useful information in addition to that provided in maps of the raw variables and in standard global segregation measures.
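One common way to localize a global segregation measure, in the spirit described above, is to decompose the index of dissimilarity into per-unit contributions, so that areas driving the overall division can be mapped. The sketch below is illustrative; the two-group counts in the test are invented, not Northern Ireland census figures:

```python
def dissimilarity(group_a, group_b):
    """Index of dissimilarity between two groups across areal units.
    Returns the global index and each unit's local contribution:
    D = 0.5 * sum_i |a_i/A - b_i/B|, where A and B are group totals."""
    A, B = sum(group_a), sum(group_b)
    local = [0.5 * abs(a / A - b / B) for a, b in zip(group_a, group_b)]
    return sum(local), local
```

Two completely segregated areas yield D = 1 with equal local contributions, while identical group proportions everywhere yield D = 0; mapping the local values rather than reporting only the global sum is what reveals the spatial variation the paper argues global measures obscure.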

Relevance: 100.00%

Abstract:

The Large Fish Indicator (LFI) is a size-based indicator of fish community state. The indicator describes the proportion by biomass of a fish community represented by fish larger than some size threshold. From an observed peak value of 0.49 in 1990, the Celtic Sea LFI declined until about 2000 and then fluctuated around 0.10 throughout the 2000s. This decline in the LFI reflected a period of diminishing ‘large’ fish biomass, probably related to high levels of size selective fishing. During the study period, fishing mortality was maintained at consistently high values. Average biomass of ‘small’ fish fluctuated across the whole time series, showing a weak positive trend in recent years. Inter-annual variation in the LFI was increasingly driven by fluctuation in small fish biomass as large fish biomass declined. Simulations using a size-based ecosystem model suggested that recovery in Celtic Sea fish community size-structure (LFI) could demand at least 20% reductions in fishing pressure and occur on decadal timescales.
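The indicator's definition, the proportion of community biomass contributed by fish above a size threshold, is straightforward to compute from survey data. The 40 cm cut-off and the sample values below are illustrative assumptions, not the Celtic Sea survey's figures:

```python
def large_fish_indicator(fish, length_threshold_cm=40.0):
    """LFI: biomass proportion of the sampled community from fish at or
    above a length threshold.  `fish` is an iterable of
    (length_cm, biomass) pairs; returns 0.0 for an empty sample."""
    total = sum(biomass for _, biomass in fish)
    large = sum(biomass for length, biomass in fish
                if length >= length_threshold_cm)
    return large / total if total else 0.0
```

A sample of one 50 cm fish and one 30 cm fish of equal biomass gives an LFI of 0.5; values such as the 0.49 peak and the 0.10 plateau in the abstract are this same ratio computed over annual survey catches.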